If you’re a webmaster, you probably received one of those infamous “Googlebot cannot access CSS and JS files on example.com” warning letters that Google sent out to seemingly every SEO and webmaster. This was a brand new alert from Google, although the search engine has been telling webmasters for some time that all resources need to be unblocked, including both JavaScript and CSS.
There was definite confusion around these letters, compounded by some of the reporting in Google Search Console. Here’s what you need to know about Google’s desire to see these resources unblocked, and how you can easily unblock them to take advantage of the associated ranking boosts.
Why does Google care?
One of the biggest complaints about the warning emails was that many felt there was no reason for Google to see these files. This was especially true because the alert flagged files that webmasters have traditionally blocked, such as files within the WordPress admin area and WordPress plugin folders.
The letter in question, which many webmasters received, definitely raised plenty of questions and concerns.
Of course, whenever Google does anything that could devalue rankings, the SEO industry tends to freak out. And the confusing message in the warning didn’t help the situation.
Why Google needs it
Google needs to render these files for a couple of key reasons. The most visible and well known is the mobile-friendly algorithm. Google needs to be able to render the page completely, including the JavaScript and CSS, to confirm that the page is mobile-friendly and to apply both the mobile-friendly tag and the associated ranking boost in mobile search results. Unblocking these resources was one of the things Google publicly recommended to webmasters who wanted the mobile-friendly boost for their pages.
However, other parts of the algorithm rely on rendering as well. The page layout algorithm, which looks at where content is placed on the page in relation to advertisements, is one example. If Google determines a webpage is mostly ads above the fold, with the actual content below the fold, it can devalue the rankings for those pages. But without access to the CSS, webmasters could make it appear to the crawler that the content is front and center when, in the fully rendered page, the ads are actually the most visible thing above the fold.
And while it’s an old-school trick and not very effective, people still use CSS and JavaScript to hide things like keyword stuffing and links, including, in the case of a hacked site, hiding them from the actual website owner. By crawling the CSS and JavaScript, Googlebot can determine whether they are being used spammily.
Google also has hundreds of other signals in its search algo, and it is very likely that a few of them use data garnered from CSS and JavaScript in some fashion. And as Google changes things, there is always the possibility that this data will feed future signals as well.
Why now?
While many SEOs had their first introduction to the perils of blocking JavaScript and CSS when they received the email from Google, Matt Cutts was actually talking about it three-and-a-half years ago in a Google Webmaster Help video.
Then, last year, Google made a significant change to their webmaster guidelines by adding it to their technical guidelines:
Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.
It still got very little attention at the time, especially since most people believed they weren’t blocking anything.
However, one major issue was that some popular WordPress SEO plugins were blocking some JavaScript and CSS. Since most WordPress users weren’t aware this was happening, it came as a surprise to learn that their sites were, in fact, blocking resources.
It also began showing up in a new “Blocked Resources” section of Google Search Console in the month preceding the mobile-friendly algo launch.
How many sites were affected?
In usual Google fashion, they didn’t give specific numbers about how many webmasters received these blocked-resources warnings. But Gary Illyes from Google did confirm that they went out to 18.7% of the webmasters who were sent the mobile-friendly warnings earlier this year:
@jenstar about 18.7% of that sent for mobile issues a few months back
– Gary Illyes (@methode) July 29, 2015
Finding blocked resources
The email that Google sent to webmasters alerting them to the issue of blocked CSS and JavaScript was confusing. It left many webmasters unsure of what exactly was being blocked and what was blocking it, particularly because they were receiving warnings for JavaScript and CSS hosted on third-party sites.
If you received one of the warning letters, the suggested way to find blocked resources was to use the Fetch tool in Google Search Console. While this might be fine for checking the homepage, for sites with more than a handful of pages it gets tedious quite quickly. Luckily, there’s an easier way than Google’s suggested method.
There’s a full walkthrough here, but for those familiar with Google Search Console: you’ll find a section called “Blocked Resources” under “Google Index,” which will tell you which JavaScript and CSS files are blocked and which pages they appear on.
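If you’d rather check things programmatically than page by page, you can also test individual resource URLs against your robots.txt with a short script. Below is a minimal sketch using Python’s standard urllib.robotparser; the example.com URLs are hypothetical placeholders, and note that Python’s parser doesn’t implement Google’s wildcard extensions, so treat its answers as an approximation of how Googlebot will behave.

# Minimal sketch: test whether Googlebot may fetch specific JS/CSS URLs
# according to your robots.txt. The URLs below are hypothetical
# placeholders; substitute the resources your pages actually load.
# Caveat: urllib.robotparser does not support Google's * and $ wildcard
# extensions, so wildcard rules may not be evaluated exactly the way
# Googlebot evaluates them.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

resources = [
    "https://example.com/wp-content/themes/mytheme/style.css",
    "https://example.com/wp-includes/js/jquery/jquery.js",
]

for url in resources:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(status, url)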
You also should make sure that you check for blocked resources after any major redesign or when launching a new site, as it isn’t entirely clear if Google is still actively sending out these emails to alert webmasters of the problem.
Homepage
There’s been some concern from those who use specialized scripts on internal pages and don’t necessarily want to unblock them for security reasons. John Mueller from Google said that Google looks primarily at the homepage, both desktop and mobile, to see what JavaScript and CSS are blocked.
So for now at least, while it is certainly a best practice to unblock CSS and JavaScript on all pages, you want to make the homepage the priority, ensuring nothing on that page is blocked. After that, you can work your way through other pages, paying special attention to pages that have unique JavaScript or CSS.
Indexing of JavaScript & CSS
Another reason many site owners give for not wanting to unblock their CSS and JavaScript is that they don’t want those files indexed by Google. But neither is a file type that Google will index, according to its long list of file types supported for indexing.
All variations
It is also worth remembering to check both the www and the non-www versions of the site for blocked resources in Google Search Console. This is often overlooked by webmasters who tend to look only at the version they prefer to use for the site.
Also, because the blocked-resources data shown in Search Console is based on when Googlebot last crawled each page, you could find additional blocked resources when checking both. This is especially true for sites that are older or not updated as frequently, and thus not crawled daily (as a more popular site is).
Likewise, if you have both a mobile version and a desktop version, you’ll want to ensure that neither is blocking any resources. It’s especially important for the mobile version, since it impacts whether each page gets the mobile-friendly tag and ranking boost in mobile search results.
And if you serve different pages based on language and location, you’ll want to check each of those as well. Don’t just check the “main” version and assume it’s all good across the entire site. It’s not uncommon to discover surprises in other variations of the same site. At the very least, check the homepage for each language and location.
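To make that easier, here is a hedged sketch, again using Python’s urllib.robotparser, that checks the same resource against the robots.txt of each variation of a site. The hostnames and the resource path are hypothetical placeholders; substitute the variations you actually serve.

from urllib.robotparser import RobotFileParser

# Hypothetical variations of one site: non-www, www, a mobile host,
# and a language subdomain. Substitute your real hostnames.
variations = [
    "https://example.com",
    "https://www.example.com",
    "https://m.example.com",
    "https://fr.example.com",
]

resource_path = "/assets/site.js"  # placeholder: a JS or CSS file your pages load

for host in variations:
    rp = RobotFileParser(host + "/robots.txt")
    rp.read()  # each variation can serve a different robots.txt
    status = "allowed" if rp.can_fetch("Googlebot", host + resource_path) else "BLOCKED"
    print(host, status)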
WordPress and blocking JavaScript & CSS
If you use one of the “SEO for WordPress”-type plugins on a WordPress-based site, chances are you’re blocking JavaScript and CSS because of that plugin. Blocking everything in the /wp-admin/ folder used to be one of the out-of-the-box default settings for some of them.
When the mobile-friendly algo came into play, the majority of WordPress users left that robots block intact, since those admin pages were not being individually indexed anyway. But this new Google warning requires that all WordPress-related JavaScript and CSS be unblocked, and Google will show an error if you block them.
Yoast, creator of the popular Yoast SEO plugin (formerly WordPress SEO), also recommends unblocking all the JavaScript and CSS in WordPress, including the /wp-admin/ folder.
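If you’d rather not open up all of /wp-admin/, one interim approach is to keep the directory blocked while explicitly allowing the JavaScript and CSS inside it, using the same directory pattern Google suggests later in this article. This is a sketch only, and it assumes no other rules in your robots.txt interfere:

User-Agent: Googlebot
Disallow: /wp-admin/
Allow: /wp-admin/*.js
Allow: /wp-admin/*.css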
Third-party resources
One of the ironies of this episode was that Google was flagging third-party JavaScript, meaning JavaScript hosted on a third-party site but called from your webpages. And yes, this includes Google’s own AdSense JavaScript.
Initially, Google suggested that website owners contact those third-party sites and ask them to unblock the JavaScript so that Googlebot could crawl it. However, not many webmasters did this; they felt it wasn’t their job, especially when they had no control over what a third-party site blocks from crawling.
Google later said that it is not concerned about third-party resources, precisely because of that lack of control. So while such resources might come up in the blocked-resources list, what Google is really looking for are the JavaScript and CSS URLs that the website owner can control through their own robots.txt.
John Mueller revealed more recently that Google was planning to reach out to some of the more frequently cited third-party sites to see if they would unblock their JavaScript. While we don’t know which sites Google intends to contact, I suspect some of them will end up unblocked. Again, while this isn’t really a webmaster problem, it will be nice to have some of those sites no longer flagged in the reports.
How to unblock your JavaScript and CSS
For most users, it’s just a case of checking your robots.txt and ensuring you’re allowing all JavaScript and CSS files to be crawled. For Yoast SEO users, you can edit your robots.txt file directly in the WordPress admin area.
Gary Illyes from Google also shared some detailed robots.txt changes on Stack Overflow. You can add these directives to your robots.txt file in order to allow Googlebot to crawl all JavaScript and CSS.
To be doubly sure you’re unblocking all JavaScript and CSS, you can add the following to your robots.txt file, provided you don’t have any directories being blocked in it already:
User-Agent: Googlebot
Allow: .js
Allow: .css
If you have a more specialized robots.txt file, where you’re blocking entire directories, it can be a bit more complicated.
In these cases, you also need to allow the .js and .css files within each of the directories you have blocked.
For example:
User-Agent: Googlebot
Disallow: /deep/
Allow: /deep/*.js
Allow: /deep/*.css
Repeat this for each directory you are blocking in robots.txt.
This allows Googlebot to crawl those files while still disallowing other crawlers (if you’ve blocked them). That said, chances are good that the bots you’re most worried about crawling your JavaScript and CSS aren’t the ones that honor robots.txt anyway.
You can instead change the User-Agent line to *, which would allow all crawlers to access those files. Bing has its own version of the mobile-friendly algo, which also requires crawling of JavaScript and CSS, although it hasn’t sent out warnings about it.
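For example, the directory snippet above, opened up to all crawlers rather than just Googlebot, would look like this:

User-Agent: *
Disallow: /deep/
Allow: /deep/*.js
Allow: /deep/*.css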
Bottom line
If you want to rank as well as you possibly can, unblocking JavaScript and CSS is one of the easiest SEO changes you can make to your site. It is especially important for sites with a significant amount of mobile traffic, since the mobile ranking algorithm requires that both be unblocked for pages to get the mobile-friendly ranking boost.
Yes, you can continue blocking Googlebot from crawling either of them, but your rankings will suffer if you do. And in a world where every position gained counts, it doesn’t make sense to sacrifice rankings in order to keep those files private.